An Improved Tikhonov-Regularized Variable Projection Algorithm for Separable Nonlinear Least Squares

Authors

Abstract

In this work, we investigate the ill-conditioned problem of a separable nonlinear least squares model using the variable projection method. Building on the truncated singular value decomposition (TSVD) method and the Tikhonov regularization method, we propose an improved regularization method that neither discards the small singular values nor applies the same correction to every singular value. By fitting the Mackey–Glass time series with an exponential model, we compare the three methods; the numerical simulation results indicate that the improved method is more effective at reducing the mean square error of the solution and increasing the accuracy of the unknown parameters.
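For intuition, the following is a minimal numerical sketch, not the authors' implementation, of how the three regularization strategies differ on the linear subproblem of variable projection. The exponential basis matrix `Phi`, the parameter `lam`, and the "improved" filter that leaves large singular values intact while damping only the small ones are illustrative assumptions about the approach described in the abstract.

```python
# Sketch: three SVD filter strategies for the ill-conditioned linear subproblem
#     min_c || Phi(a) c - y ||
# that arises in variable projection for fixed nonlinear parameters a.
import numpy as np

def regularized_linear_solve(Phi, y, lam, method="improved"):
    """Solve the linear subproblem via the SVD with a chosen filter factor."""
    U, s, Vt = np.linalg.svd(Phi, full_matrices=False)
    beta = U.T @ y
    if method == "tsvd":          # discard singular values below the threshold
        f = (s >= lam).astype(float)
    elif method == "tikhonov":    # damp every singular value
        f = s**2 / (s**2 + lam**2)
    elif method == "improved":    # assumed blend: keep large s_i, damp only small ones
        f = np.where(s >= lam, 1.0, s**2 / (s**2 + lam**2))
    else:
        raise ValueError(method)
    return Vt.T @ (f * beta / s)

# Hypothetical exponential model Phi(a)_ij = exp(-a_j * t_i) for illustration;
# nearly collinear columns make the subproblem ill-conditioned.
t = np.linspace(0.0, 1.0, 50)
a = np.array([1.0, 1.05])
Phi = np.exp(-np.outer(t, a))
c_true = np.array([2.0, -1.0])
y = Phi @ c_true + 1e-3 * np.random.default_rng(0).standard_normal(t.size)

for m in ("tsvd", "tikhonov", "improved"):
    c = regularized_linear_solve(Phi, y, lam=1e-2, method=m)
    print(m, np.round(c, 3))
```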


Related Articles

Variable projection for nonlinear least squares problems

The variable projection algorithm of Golub and Pereyra (1973) has proven to be quite valuable in the solution of nonlinear least squares problems in which a substantial number of the parameters are linear. Its advantages are efficiency and, more importantly, a better likelihood of finding a global minimizer rather than a local one. The purpose of our work is to provide a more robust implementat...


An Efficient Algorithm for the Separable Nonlinear Least Squares Problem

The nonlinear least squares problem min_{y,z} ‖A(y)z + b(y)‖, where A(y) is a full-rank (N + ℓ) × N matrix, y ∈ R^n, z ∈ R^N and b(y) ∈ R^{N+ℓ} with ℓ ≥ n, can be solved by first solving a reduced problem min_y ‖f(y)‖ to find the optimal value y* of y, and then solving the resulting linear least squares problem min_z ‖A(y*)z + b(y*)‖ to find the optimal value z* of z. We have previously justified the use o...
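A brief sketch, under assumed model functions, of the reduction just described: z is eliminated by a linear least-squares solve inside the reduced residual f(y), y is optimized, and z is recovered at y*. The exponential columns, the starting point, and the use of scipy.optimize.least_squares are assumptions for illustration, not the paper's implementation.

```python
# Sketch of the reduced-problem approach for min_{y,z} ||A(y) z + b(y)||.
import numpy as np
from scipy.optimize import least_squares

def reduced_residual(y, A, b):
    """f(y): residual after eliminating z via a linear least-squares solve."""
    Ay, by = A(y), b(y)
    z, *_ = np.linalg.lstsq(Ay, -by, rcond=None)
    return Ay @ z + by

# Hypothetical separable model: columns exp(-y_k * t), b(y) = -data.
t = np.linspace(0.0, 2.0, 40)
data = 3.0 * np.exp(-0.5 * t) - 1.0 * np.exp(-2.0 * t)
A = lambda y: np.exp(-np.outer(t, y))
b = lambda y: -data

# Step 1: minimize the reduced residual over the nonlinear parameters y.
y_star = least_squares(reduced_residual, x0=np.array([0.3, 1.5]), args=(A, b)).x
# Step 2: recover the linear parameters z at the optimum y*.
z_star, *_ = np.linalg.lstsq(A(y_star), -b(y_star), rcond=None)
print("y* =", np.round(y_star, 3), " z* =", np.round(z_star, 3))
```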


Optimal Rates for Regularized Least-squares Algorithm

We develop a theoretical analysis of the generalization performance of the regularized least-squares algorithm on a reproducing kernel Hilbert space in the supervised learning setting. The presented results hold in the general framework of vector-valued functions; therefore, they can be applied to multi-task problems. In particular, we observe that the concept of effective dimension plays a central ...


Fast Rates for Regularized Least-squares Algorithm

We develop a theoretical analysis of the generalization performance of regularized least-squares on reproducing kernel Hilbert spaces for supervised learning. We show that the concept of the effective dimension of an integral operator plays a central role in the definition of a criterion for the choice of the regularization parameter as a function of the number of samples. In fact, a minimax analysis is...



Journal

Journal title: Axioms

Year: 2021

ISSN: 2075-1680

DOI: https://doi.org/10.3390/axioms10030196